10 DevOps strategies for working with legacy databases

Database teams often maintain large and complex legacy databases that they must keep operational to support existing applications. The teams might have maintained the databases for years, or they might have inherited them through acquisitions or other circumstances. Whatever the reasons, they likely manage and update the databases in much the same way they always have, performing their tasks mostly segregated from other efforts. But this approach makes it difficult for organizations to fully embrace modern application methodologies such as DevOps, which rely on more agile and aggressive application delivery processes.

The traditional methods for maintaining legacy databases can slow application delivery, reduce productivity, and increase overall costs. Some organizations are starting to incorporate databases into their DevOps processes, but it’s not always an easy goal to achieve. Application and database development have historically been very different disciplines, and synchronizing changes between them can be a significant challenge. Another serious consideration is data persistence, which plays a major role in database updates but matters little to the applications themselves.

Despite these issues, many organizations have come to recognize the importance of incorporating databases into the DevOps model. By applying the same principles to databases as those used for delivering applications, they can better control the entire application development and deployment process, including the databases that go with them. For organizations ready to make the transition, this article offers 10 strategies for incorporating legacy databases into their DevOps pipelines.

1. Choose your database strategy wisely.

Before embarking on your database DevOps journey, give careful consideration to how you should proceed. Start by identifying your business objectives and long-term strategies. In some cases, it might not be worth investing the time and resources necessary to apply DevOps to a legacy database. For example, the database might support applications that will soon be phased out, in which case, you might need to focus on how to migrate the data, rather than continuing to support the legacy database.

For those databases that you’ll continue to support, keep the overall scope in mind. It’s better to take one small step at a time than to try to bring every database into the DevOps fold all at once. You should be planning for the long term rather than trying to do everything overnight. You might start with a small database to see how the process goes and then move on from there, or you might deploy a new database so your database team can become familiar with DevOps principles before tackling the legacy databases.

2. Create a DevOps culture that includes the database team.

Discussions around DevOps inevitably point to the importance of creating a culture of collaboration and transparency, one that fosters open communications for all individuals involved in application delivery. DevOps team members should be encouraged to work together and share in the responsibility for application delivery, adopting a mindset in which everyone has a stake in the outcome.

Not that long ago, DevOps teams rarely included database team members, leaving them out of the discussion altogether. For the most part, databases were seen as completely separate entities from the applications they served. However, the only way to successfully incorporate legacy databases into the DevOps process is to ensure that the database team is as much a part of the DevOps effort as the development and operations teams. The better the communications between the database team and everyone else, the smoother the transition to database DevOps.

3. Get the right tools in place.

DevOps teams need the right tools to support the continuous integration/continuous delivery (CI/CD) pipeline common to DevOps deployments. For example, a team might use Chef for configuration management, Jenkins for build automation, Kubernetes for deploying containers, or NUnit for unit testing. The exact tools depend on an organization’s specific requirements and the type of applications they’re deploying. DevOps teams should select their tools carefully, taking into account the growing need to support database deployments.

Initially, DevOps solutions tended to leave databases out of the equation, but that’s been steadily changing. Many DevOps tools now accommodate databases, and many database tools now accommodate DevOps. For example, Redgate Deploy can help incorporate a database into the CI/CD pipeline and can integrate with any CI server that supports PowerShell. Redgate Deploy can also integrate with common CI and release tools, including Jenkins, GitHub, TeamCity, and Azure DevOps.

4. Prepare the database team for DevOps.

DevOps methodologies require special skills to ensure that applications are properly built, tested, and deployed to their various environments. If developers also implement infrastructure as code (IaC), DevOps teams require skills in this area as well. In some cases, an organization might need to bring on additional personnel or outside consultants. For many DevOps teams, however, all they need is enough training and education to get them up and running. This is just as true for the database team as anyone else.

For example, individuals on the database team might focus on specific areas, such as Oracle database development or SQL Server administration. To prepare them for a transition to database DevOps, they should be trained in DevOps methodologies, particularly as they apply to database deployments. In this way, they’ll be much better prepared for the transition and able to understand why specific steps need to be taken. They’ll also be able to more effectively communicate with other DevOps team members about CI/CD operations.

5. Document your databases and data.

When preparing your databases for the transition to DevOps, you should ensure that they’re fully documented so that everyone on the DevOps team can quickly understand how a database’s objects are used and how they impact the application as a whole. Anyone looking at the documentation should be able to understand the types of data being stored, how the data is related, and any special considerations that might apply to the data.

One approach to documentation is to use the features built into the database platform. For example, SQL Server supports extended properties, which can be leveraged to document individual objects. In this way, the data definitions are stored alongside the schema definitions, providing immediate access to the information when needed. Better still, the extended properties also get checked into source control when the schema files are checked in (assuming that source control is being used, as it certainly should be).
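As a minimal sketch, the following T-SQL shows how a column description might be attached with sp_addextendedproperty and read back later. The dbo.Customer table, its Email column, and the description text are hypothetical examples; the same pattern applies to any object you want to document.

-- Attach a description to a hypothetical column (dbo.Customer.Email).
EXEC sys.sp_addextendedproperty
    @name       = N'MS_Description',
    @value      = N'Primary contact email address; expected to be unique per customer.',
    @level0type = N'SCHEMA', @level0name = N'dbo',
    @level1type = N'TABLE',  @level1name = N'Customer',
    @level2type = N'COLUMN', @level2name = N'Email';

-- Read the documentation back alongside the schema when needed.
SELECT objtype, objname, name, value
FROM fn_listextendedproperty(
    N'MS_Description', N'SCHEMA', N'dbo', N'TABLE', N'Customer', N'COLUMN', DEFAULT);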

6. Get databases into version control.

Many database teams are already putting their database script files into source control to maintain file versions, often along with lookup or seed data. The databases might not participate in DevOps processes, but the source files are still protected. If a database team is not using source control, they’ll need to start. Version control is an essential part of DevOps methodologies. It offers one source of truth for all code files, resulting in fewer errors and code conflicts. It also provides a record of all changes, and it makes it possible to roll back changes to a previous version.

Database teams should store all their database code in source control, using a product like SQL Source Control, without exception. This includes the scripts used to build the database as well as the change scripts that modify the schema. It also includes scripts that create stored procedures and user-defined functions, along with any data modification scripts. In addition, it’s a good idea to keep static data, such as lookup data, in source control, as well as configuration files where applicable. Source control check-ins should also incorporate code reviews to reduce the possibility of errors.
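As an illustration, the following sketch shows the kind of rerunnable scripts that might be checked in; the stored procedure and lookup table names are hypothetical, and the exact conventions will depend on the tools your team adopts. Writing the scripts so they can be run repeatedly makes them safer to apply through an automated pipeline.

-- Programmable object script: CREATE OR ALTER (SQL Server 2016 SP1 and later)
-- makes the script safe to re-run in any environment.
CREATE OR ALTER PROCEDURE dbo.GetOrderStatus
    @OrderID int
AS
BEGIN
    SELECT o.OrderID, s.StatusName
    FROM dbo.[Order] AS o
    JOIN dbo.OrderStatus AS s ON s.StatusID = o.StatusID
    WHERE o.OrderID = @OrderID;
END;
GO

-- Static lookup data: MERGE keeps the table in sync with the checked-in values.
MERGE dbo.OrderStatus AS target
USING (VALUES (1, N'Open'), (2, N'Shipped'), (3, N'Closed'))
    AS source (StatusID, StatusName)
    ON target.StatusID = source.StatusID
WHEN MATCHED THEN
    UPDATE SET StatusName = source.StatusName
WHEN NOT MATCHED THEN
    INSERT (StatusID, StatusName) VALUES (source.StatusID, source.StatusName);
GO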

7. Prepare for data migration.

When you update an application, you don’t have to worry about persisting data from one version to the next. It’s not so easy with databases. You can’t simply update or replace a schema without considering the impact on the data. Even a minor update can result in lost or truncated data. With any database update, you must take into account how your changes will impact the existing data, what steps you must take to preserve that data, and how to apply any new or modified data to the updated database.

When preparing to incorporate your databases into DevOps, you need to have in place a system for ensuring that any new database versions get the data they need. If you’re doing an in-place schema update, you might also need scripts for modifying and preserving data. If you’re re-creating the database from scratch, you need the scripts necessary to populate the tables. In either case, those scripts should be checked into source control along with the schema changes, so they’re included in the CI/CD build process.
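A simple, hypothetical illustration of a data-preserving change might look like the following: a new typed column is added as nullable, backfilled from the existing data, and the legacy column is dropped only in a later release once nothing depends on it. The table, column, and lookup names are invented for the example.

-- Step 1: add the new column as NULLable so existing rows remain valid.
IF COL_LENGTH('dbo.Customer', 'CountryCode') IS NULL
    ALTER TABLE dbo.Customer ADD CountryCode char(2) NULL;
GO

-- Step 2: backfill the new column from the legacy free-text column.
UPDATE c
SET    c.CountryCode = l.CountryCode
FROM   dbo.Customer AS c
JOIN   dbo.CountryLookup AS l ON l.CountryName = c.Country
WHERE  c.CountryCode IS NULL;
GO

-- Step 3 (a later, separate migration, once no code reads the old column):
-- ALTER TABLE dbo.Customer DROP COLUMN Country;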

8. Shift left in your thinking.

DevOps methodologies include the concept of shifting left, which refers to the idea of performing certain tasks earlier in the application lifecycle. For example, instead of waiting to build the application and test its code until late in the development cycle, building and testing become an ongoing process that takes place as soon as updated script files are checked into source control. Also important to this process is that code check-ins occur frequently and in smaller chunks. The shift-left approach makes it easier to address issues sooner in the development process when they’re far more manageable.

The shift-left strategy should also be employed when incorporating databases into the DevOps pipeline, with developers checking in their script files on a frequent and ongoing basis. In addition, they should create database-specific tests, such as ones that verify object structure or query results. That way, when a developer checks in code changes, the CI server builds the database and runs the tests so that each change checked into source control is immediately verified. This doesn’t preclude other types of testing later in the cycle, but it helps catch certain issues earlier in the process when they’re much easier to resolve.
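For example, a structure or query-result check might be as simple as the following hypothetical script, which raises an error, and therefore fails the build, if the expected column or lookup rows are missing. The object names and expected values carry over from the earlier sketches and are illustrative only.

-- Structure check: fail the build if the column is missing or mistyped.
IF NOT EXISTS (
    SELECT 1
    FROM sys.columns AS c
    JOIN sys.types   AS t ON t.user_type_id = c.user_type_id
    WHERE c.object_id = OBJECT_ID(N'dbo.Customer')
      AND c.name = N'CountryCode'
      AND t.name = N'char'
      AND c.max_length = 2
)
    THROW 50001, 'Test failed: dbo.Customer.CountryCode (char(2)) is missing or mistyped.', 1;

-- Query-result check: fail the build if required lookup rows are absent.
IF (SELECT COUNT(*) FROM dbo.OrderStatus) < 3
    THROW 50002, 'Test failed: dbo.OrderStatus is missing expected lookup rows.', 1;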

9. Automate database operations.

Automation is a key component of a successful DevOps operation. This applies to the core application as well as the database. Automation helps avoid repetitive tasks, reduces the potential for errors, and ensures consistent results with each deployment. To this end, DevOps teams must ensure that the build, test, and deployment operations incorporate the database script files along with the application files. This might mean acquiring new CI/CD tools that can better accommodate databases or reconfiguring existing ones to handle the additional requirements.

The key, of course, is to ensure that the database developers are checking their files into source control and following a consistent, agreed-upon process for managing script files. For example, they should decide whether to take a state-based approach or migration-based approach to deploying database updates. Although it’s possible to employ both strategies, choosing one over the other makes it easier to collaborate on a single code base while avoiding the complexities of balancing both at the same time.
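To illustrate the migration-based approach, the following hand-rolled sketch uses a hypothetical history table so that each checked-in change script is applied exactly once per environment. In practice, migration tooling typically manages this bookkeeping for you; the script and table names here are invented for the example.

-- One-time setup: a table that records which migrations have already run.
IF OBJECT_ID(N'dbo.MigrationHistory') IS NULL
BEGIN
    CREATE TABLE dbo.MigrationHistory (
        MigrationID varchar(100) NOT NULL PRIMARY KEY,
        AppliedAt   datetime2(0) NOT NULL DEFAULT SYSUTCDATETIME()
    );
END;
GO

-- Each migration script guards itself so it can be deployed to every environment.
IF NOT EXISTS (SELECT 1 FROM dbo.MigrationHistory
               WHERE MigrationID = '2024-06-01_add_last_order_date')
BEGIN
    ALTER TABLE dbo.Customer ADD LastOrderDate date NULL;

    INSERT INTO dbo.MigrationHistory (MigrationID)
    VALUES ('2024-06-01_add_last_order_date');
END;
GO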

10. Perform ongoing monitoring.

Like automation, continuous monitoring is an essential component of a successful DevOps operation. This includes monitoring the systems themselves, as well as providing continual feedback at every stage of the DevOps process. Comprehensive monitoring ensures that everything is functioning properly, while helping to identify issues early in the development cycle. Not only does this make for better applications, but it also helps to improve the CI/CD process itself.

Most DBAs are already familiar with the importance of monitoring their production database systems for performance, security, and compliance issues. In all likelihood, they already have the tools in place for identifying issues and determining which ones need immediate attention. The continuous feedback loop represents a different type of monitoring. It provides database developers with visibility across the entire pipeline and alerts them to any problems with their script files during the integration, testing, and deployment phases. For this reason, you must ensure that your pipeline’s feedback loop takes into account the database script files at every phase of the operation.
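As one example of the production side of that monitoring, a DBA might run a query like the following against SQL Server’s dynamic management views to surface the most expensive statements by average CPU time, making a regression introduced by a recent deployment easier to spot. The TOP (10) cutoff is an arbitrary choice for the sketch.

-- Top statements by average CPU time, based on cached query statistics.
SELECT TOP (10)
    qs.total_worker_time / qs.execution_count AS avg_cpu_time,
    qs.execution_count,
    qs.last_execution_time,
    SUBSTRING(st.text, (qs.statement_start_offset / 2) + 1,
        ((CASE qs.statement_end_offset
              WHEN -1 THEN DATALENGTH(st.text)
              ELSE qs.statement_end_offset
          END - qs.statement_start_offset) / 2) + 1) AS statement_text
FROM sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
ORDER BY avg_cpu_time DESC;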

Looking to the future

All databases have a limited lifespan, and it’s just a matter of time before your legacy databases must be overhauled or replaced with another system. For example, you might find that a document database such as MongoDB will better serve an application than a relational database. Many organizations are also looking for ways to better accommodate the persistent data requirements that go with containers, microservices, and cloud-based applications.

No matter what direction your organization is heading, chances are it’s moving toward a goal of faster release cycles and greater agility. Database teams are under increasing pressure to adopt this strategy and take a more flexible approach to database maintenance. Instead of thinking of databases as unmovable monolithic structures, they must learn to treat them as manageable code that can be tested, automated, and deployed just like application code. DevOps can make this transition a lot easier, while also preparing database teams for what might be coming in the future.

 


About the author

Robert Sheldon


Robert is a freelance technology writer based in the Pacific Northwest. He’s worked as a technical consultant and has written hundreds of articles about technology for both print and online publications, with topics ranging from predictive analytics to 5D storage to the dark web. He’s also contributed to over a dozen books on technology, developed courseware for Microsoft’s training program, and served as a developmental editor on Microsoft certification exams. When not writing about technology, he’s working on a novel or venturing out into the spectacular Northwest woods.